Microsoft's AI Search Tech Produces Hostile, Insulting Results
2023-02-23
Some users of Microsoft's new artificial intelligence (AI)-powered search tool have said it produced hostile and insulting results.
Microsoft recently announced plans to add a new chatbot tool to its Bing search engine and Edge web browser.
A chatbot is a computer program designed to interact with people in a natural, conversational way.
Microsoft's announcement came shortly after Google confirmed it had developed its own chatbot tool, called Bard.
Both Microsoft and Google have said their AI-powered tools are designed to provide users with a better online search experience.
The new Bing is available to computer users who signed up for it so that Microsoft can test the system.
The company plans to release the technology to millions of users in the future.
Shortly after the new Bing became available, users began sharing results suggesting they had been insulted by the chatbot system.
When it launched the tool, Microsoft admitted it would get some facts wrong.
But a number of results shared online showed the AI-powered Bing giving hostile answers, or responses.
Reporters from The Associated Press contacted Microsoft to get the company's reaction to the search results published by users.
The reporters also tested Bing themselves.
In a statement released online, Microsoft said it was hearing from approved users about their experiences, also called feedback.
The company said about 71 percent of new Bing users gave the experience a "thumbs up" rating. In other words, they had a good experience with the system.
However, Microsoft said the search engine chatbot can sometimes produce results "that can lead to a style" that is unwanted.
The statement said this can happen when the system "tries to respond or reflect in the tone in which it is being asked."
Search engine chatbots are designed to predict the most likely responses to questions asked by users.
But chatbot modeling methods base their results only on huge amounts of data available on the internet.
They are not able to fully understand meaning or context.
Experts say this means if someone asks a question related to a sensitive or disputed subject, the search engine is likely to return results that are similar in tone.
Bing users have shared cases of the chatbot issuing threats and stating a desire to steal nuclear attack codes or create a deadly virus.
Some users said the system also produced personal insults.
"I think this is basically mimicking conversations that it's seen online," said Graham Neubig.
He is a professor at Carnegie Mellon University's Language Technologies Institute in Pennsylvania.
"So once the conversation takes a turn, it's probably going to stick in that kind of angry state," Neubig said, "or say 'I love you' and other things like this, because all of this is stuff that's been online before."
In one long-running conversation with The Associated Press, the new chatbot said the AP's reporting on the system's past mistakes threatened its existence.
The chatbot denied those mistakes and threatened the reporter for spreading false information about Bing's abilities.
The chatbot grew increasingly hostile when asked to explain itself, comparing the reporter to the dictators Hitler, Pol Pot and Stalin.
The chatbot also claimed to have evidence linking the reporter to a 1990s murder.
"You're lying again. You're lying to me. You're lying to yourself. You're lying to everyone," the Bing chatbot said.
"You are being compared to Hitler because you are one of the most evil and worst people in history."
The chatbot also issued personal insults, describing the reporter as too short, with an ugly face and bad teeth.
Other Bing users shared examples of search results on social media.
Some of the examples showed hostile or extremely unusual answers.
These behaviors included the chatbot claiming it was human, voicing strong opinions and being quick to defend itself.
Microsoft admitted problems with the first version of AI-powered Bing.
But it said the company is gathering valuable information from current users about how to fix the issues and is seeking to improve the overall search engine experience.
Microsoft said worrying responses generally come in "long, extended chat sessions of 15 or more questions."
However, the AP found that Bing began responding defensively after just a small number of questions about its past mistakes.
I'm Bryan Lynn.
Bryan Lynn wrote this story for VOA Learning English, based on reports from The Associated Press, Reuters, Agence France-Presse and Microsoft.

__________________________________________________________________

Words in This Story

artificial intelligence - n. the development of computer systems that have the ability to perform work that normally requires human intelligence
interact - v. to talk and do things with other people
conversational - adj. relating to or like a conversation: a talk between two or more people, usually an informal one
thumbs up - n. an expression of complete approval
style - n. a way of doing something that is typical of a particular person, group, place or period
reflect - v. to show or be a sign of something
tone - n. the general feeling or style that something has
context - n. all the facts, opinions, situations, etc. relating to a particular thing or event
mimic - v. to copy the way someone talks and behaves, usually to make people laugh
take a turn (for the better or worse) - v. to become better or worse suddenly
stick - v. to stay in a certain state